Path Finder is a supportive navigation system created to help visually impaired individuals move more safely and confidently. Traditional tools, such as white canes, often miss important details, especially small obstacles on the ground or hazards at head level. This can lead to accidents and make navigating busy or unfamiliar places stressful. Path Finder addresses these challenges by using smart sensors and wireless communication to give users real-time awareness of their surroundings. The system is made up of two parts: Smart Footwear and Smart Eyewear. The Smart Footwear uses a VL53L0X Time-of-Flight sensor and an ESP32 microcontroller to detect steps, curbs, or uneven ground and sends this information wirelessly. The Smart Eyewear, built with an ESP32-C3, receives this data and alerts the user through a small buzzer: the closer the obstacle, the faster and louder the sound. Working together, these two devices offer a practical, affordable solution that helps visually impaired users navigate more safely and independently.
Introduction
Path Finder is an innovative Electronic Travel Aid (ETA) designed to enhance independent mobility and safety for visually impaired individuals. Traditional aids like white canes, while useful, have limitations in detecting ground-level hazards such as curbs, steps, potholes, and uneven surfaces, and they require continuous manual effort. Path Finder overcomes these limitations by providing a smart, hands-free, and technology-assisted navigation solution.
The system consists of two coordinated modules: Smart Footwear and Smart Eyewear. The Smart Footwear serves as the sensing unit, using a high-precision VL53L0X Time-of-Flight (ToF) laser sensor controlled by an ESP32 microcontroller to accurately detect ground-level obstacles. Unlike ultrasonic sensors, ToF technology provides reliable distance measurements regardless of surface texture or lighting conditions. The Smart Eyewear acts as the feedback unit, powered by an ESP32-C3 microcontroller, which receives processed distance data wirelessly via Bluetooth Low Energy (BLE).
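To make the wireless link concrete, the exchange between the two modules can be sketched as a compact notification payload: the footwear packs each processed distance reading into a few bytes, and the eyewear unpacks it on receipt. The 4-byte layout, field names, and the inclusion of a sequence counter below are illustrative assumptions, not the project's actual BLE characteristic format.

```python
import struct

# Hypothetical 4-byte payload for a BLE notify characteristic:
# a little-endian uint16 distance in millimetres plus a uint16
# sequence counter for detecting dropped notifications.

def encode_reading(distance_mm: int, seq: int) -> bytes:
    """Pack one VL53L0X reading on the footwear (ESP32) side."""
    return struct.pack("<HH", distance_mm, seq & 0xFFFF)

def decode_reading(payload: bytes) -> tuple[int, int]:
    """Unpack a payload on the eyewear (ESP32-C3) side."""
    distance_mm, seq = struct.unpack("<HH", payload)
    return distance_mm, seq
```

Keeping the payload this small suits BLE's low-throughput, low-energy design: only the processed distance crosses the link, while raw sensor handling stays on the footwear.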
User feedback is delivered through a variable-frequency buzzer integrated into the eyewear. The system translates distance information into intuitive sound patterns: no sound in safe zones, slow beeps in caution zones, and continuous high-frequency alerts in danger zones. This gradual auditory feedback allows users to perceive obstacle proximity in real time without visual input or hand movement, reducing cognitive and physical strain.
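The three-zone mapping described above can be sketched as a small lookup from measured distance to a beep period. The threshold values and the 500–1000 ms caution-zone range here are assumptions for illustration, not the tuned figures from the prototype.

```python
# Illustrative distance-to-feedback mapping. Thresholds are assumed:
# beyond SAFE_MM the buzzer is silent; between CAUTION_MM and SAFE_MM
# it beeps slowly, speeding up as the obstacle nears; below CAUTION_MM
# it emits a rapid continuous alert.

SAFE_MM = 1500
CAUTION_MM = 800

def feedback(distance_mm: int):
    """Return (zone, beep_period_ms); a period of None means silence."""
    if distance_mm >= SAFE_MM:
        return "safe", None
    if distance_mm >= CAUTION_MM:
        # Scale the beep period linearly from 500 ms (near) to 1000 ms (far).
        span = SAFE_MM - CAUTION_MM
        period = 500 + 500 * (distance_mm - CAUTION_MM) // span
        return "caution", period
    return "danger", 100
```

A graded mapping like this is what lets the user judge proximity by ear alone, rather than receiving a single undifferentiated alarm.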
Path Finder is lightweight, cable-free, energy-efficient, and powered by a portable lithium-polymer battery, making it suitable for daily use. Performance analysis shows that the system continuously monitors the walking path, detects hazards accurately, and provides timely alerts before the user encounters danger. Overall, Path Finder offers a practical, affordable, and user-friendly solution that significantly improves safety, confidence, and independence for visually impaired individuals navigating complex environments.
Conclusion
The development of the Path Finder system shows how modern sensing and wireless technologies can come together to meaningfully improve independent mobility for people with visual impairments. Instead of relying on conventional tools that have clear limitations, Path Finder uses VL53L0X Time-of-Flight (ToF) sensing to detect ground-level hazards and elevation changes with high accuracy, identifying steps, curbs, and uneven surfaces far more reliably than traditional ultrasonic solutions. The project successfully met its key technical goals. The smart footwear module precisely detects potential tripping hazards, delivering consistent performance regardless of environmental noise, lighting conditions, or surface color. By using Bluetooth Low Energy (BLE) communication between the shoe-mounted ESP32 and the eyewear-mounted ESP32-C3, the system removes the need for physical wires, resulting in a safer, lighter, and more comfortable user experience. Distance information is conveyed through a variable-frequency buzzer, transforming complex sensor data into clear, real-time audio cues that are easy for users to understand and respond to.
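The elevation-change detection mentioned above can be illustrated by comparing each downward ToF reading against a calibrated baseline floor distance and flagging significant deviations. The baseline and threshold values below are assumed for the sketch; a real device would calibrate them per user and mounting position.

```python
# Sketch of ground-hazard classification from a downward-facing ToF
# sensor on the footwear. BASELINE_MM is the nominal sensor-to-ground
# distance on level floor; deviations beyond THRESHOLD_MM are flagged.
# Both constants are illustrative assumptions.

BASELINE_MM = 300
THRESHOLD_MM = 60

def classify_ground(distance_mm: int) -> str:
    """Label a reading as level ground, a drop, or a rise."""
    delta = distance_mm - BASELINE_MM
    if delta > THRESHOLD_MM:
        return "drop"   # descending step, curb down, or hole
    if delta < -THRESHOLD_MM:
        return "rise"   # ascending step or low obstacle
    return "level"
```

Because the laser measurement is independent of surface texture and ambient light, the same thresholds hold across the varied conditions where ultrasonic ranging tends to degrade.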
Although the current prototype primarily focuses on ground-level safety, it stands as a strong proof-of-concept for a distributed wearable assistance system. Its modular design opens the door to future enhancements, such as head-level obstacle detection, vibration-based haptic alerts, or expanded sensor coverage. Overall, Path Finder demonstrates that affordable, off-the-shelf components can be thoughtfully combined to create impactful assistive technology, empowering visually impaired individuals with greater confidence, safety, and independence in their daily lives.